Kullback-Leibler Divergence Measure for Multivariate Skew-Normal Distributions
Authors
Abstract
The aim of this work is to provide the tools to compute the well-known Kullback–Leibler divergence measure for the flexible family of multivariate skew-normal distributions. In particular, we use the Jeffreys divergence measure to compare the multivariate normal distribution with the multivariate skew-normal distribution, showing that this is equivalent to comparing univariate versions of these distributions. Finally, we apply our results to a seismological catalogue data set related to the 2010 Maule earthquake. Specifically, we compare the distributions of the local magnitudes in the regions formed by the aftershocks.
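As a quick illustration of the quantities named in the abstract, the sketch below estimates both Kullback–Leibler directions and the Jeffreys divergence (their sum) by Monte Carlo, in the univariate case to which the paper reduces the comparison. The slant parameter a = 3 and the sample size are placeholder choices, and scipy's skewnorm stands in for the skew-normal family; this is not the paper's closed-form expression.

# Monte Carlo sketch of KL and Jeffreys divergences between a normal
# and a skew-normal distribution (univariate; parameters illustrative).
import numpy as np
from scipy.stats import norm, skewnorm

rng = np.random.default_rng(0)
n = 200_000                      # Monte Carlo sample size

f = norm(loc=0.0, scale=1.0)     # reference normal density
g = skewnorm(a=3.0)              # skew-normal with slant parameter a = 3

def kl_mc(p, q, size):
    """Estimate KL(p || q) = E_p[log p(X) - log q(X)] by sampling from p."""
    x = p.rvs(size=size, random_state=rng)
    return np.mean(p.logpdf(x) - q.logpdf(x))

kl_fg = kl_mc(f, g, n)
kl_gf = kl_mc(g, f, n)
print(f"KL(f||g) ~ {kl_fg:.4f}")
print(f"KL(g||f) ~ {kl_gf:.4f}")
print(f"Jeffreys J(f,g) ~ {kl_fg + kl_gf:.4f}")   # symmetrized KL

The Jeffreys measure is used here precisely because plain KL is asymmetric; summing the two directions gives a symmetric comparison between the two families.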
Similar Resources
Generalized Skew-Normal Negentropy and Its Application to Fish Condition Factor Time Series
The problem of measuring the disparity of a particular probability density function from a normal one has been addressed in several recent studies. The most common approach has been to derive exact expressions for information measures over particular distributions. In this paper, we consider a class of asymmetric distributions with a normal kernel, called Generalized Skew-Normal...
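The negentropy at issue in this entry is the entropy gap between a density and the normal law with the same variance. Below is a minimal Monte Carlo sketch for a univariate skew-normal, using scipy's skewnorm with an arbitrary slant a = 4 rather than anything from the paper:

# Negentropy sketch: entropy of the matching normal minus entropy of f.
import numpy as np
from scipy.stats import skewnorm

rng = np.random.default_rng(1)
sn = skewnorm(4.0)                              # illustrative slant parameter

x = sn.rvs(size=200_000, random_state=rng)
h_sn = -np.mean(sn.logpdf(x))                   # Monte Carlo entropy of the skew-normal
h_gauss = 0.5 * np.log(2 * np.pi * np.e * sn.var())  # exact entropy of a normal with the same variance
print(f"negentropy ~ {h_gauss - h_sn:.4f}")     # >= 0, zero iff the density is normal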
Information Measures via Copula Functions
In applications of differential geometry to problems of parametric inference, the notion of divergence is often used to measure the separation between two parametric densities. In this paper, we examine measures such as the Kullback-Leibler information, J-divergence, Hellinger distance, α-divergence, and so on. Properties and results related to distance between probability d...
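For the measures this entry lists, a direct numerical-integration sketch is easy to write down. The two normal densities and the choice α = 0.5 below are placeholders, not values from the paper:

# Numerical-integration sketch of KL, Jeffreys, Hellinger, and
# alpha-divergence between two univariate normal densities.
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

p = norm(0.0, 1.0).pdf
q = norm(1.0, 2.0).pdf

kl   = quad(lambda x: p(x) * np.log(p(x) / q(x)), -30, 30)[0]
kl_r = quad(lambda x: q(x) * np.log(q(x) / p(x)), -30, 30)[0]
hell = np.sqrt(0.5 * quad(lambda x: (np.sqrt(p(x)) - np.sqrt(q(x)))**2, -30, 30)[0])
alpha = 0.5
a_div = (1.0 / (alpha * (alpha - 1.0))) * (
    quad(lambda x: p(x)**alpha * q(x)**(1.0 - alpha), -30, 30)[0] - 1.0
)
print(f"KL(p||q) = {kl:.4f}")
print(f"J(p,q)   = {kl + kl_r:.4f}")            # Jeffreys = symmetrized KL
print(f"Hellinger = {hell:.4f}")
print(f"alpha-divergence (alpha=0.5) = {a_div:.4f}")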
Multivariate normal distribution - Wikipedia, the free encyclopedia
Contents: 1 General case; 1.1 Cumulative distribution function; 1.2 A counterexample; 1.3 Normally distributed and independent; 2 Bivariate case; 3 Affine transformation; 4 Geometric interpretation; 5 Correlations and independence; 6 Higher moments; 7 Conditional distributions; 8 Fisher information matrix; 9 Kullback-Leibler divergence; 10 Estimation of parameters; 11 Entropy; 12 Multivariate normality tests; 13 Drawin...
Assessing the effect of kurtosis deviations from Gaussianity on conditional distributions
Keywords: multivariate exponential power distributions; kurtosis; Kullback–Leibler divergence; relative sensitivity. Abstract: The multivariate exponential power family is considered for n-dimensional random variables Z, with a known partition Z = (Y, X) of dimensions p and n − p, respectively, with interest focusing on the conditional distribution Y|X. An infinitesimal variation of any param...
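To make this setup concrete for the Gaussian member of the exponential power family, the sketch below conditions Y on X = x and uses the closed-form KL divergence between multivariate normals to gauge how perturbing one covariance entry shifts the conditional law. Dimensions, the covariance matrix, and the perturbation are all invented for illustration; the paper works with the wider exponential power family.

# Conditional distribution Y | X = x of a multivariate normal, and the
# closed-form KL divergence between the original and perturbed conditionals.
import numpy as np

def conditional_mvn(mu, S, x, p):
    """Parameters of Y | X = x when (Y, X) ~ N(mu, S), with Y of dimension p."""
    mu_y, mu_x = mu[:p], mu[p:]
    Syy, Syx, Sxx = S[:p, :p], S[:p, p:], S[p:, p:]
    A = Syx @ np.linalg.inv(Sxx)
    return mu_y + A @ (x - mu_x), Syy - A @ Syx.T

def kl_mvn(m0, S0, m1, S1):
    """KL( N(m0,S0) || N(m1,S1) ), closed form."""
    k = len(m0)
    S1i = np.linalg.inv(S1)
    d = m1 - m0
    return 0.5 * (np.trace(S1i @ S0) + d @ S1i @ d - k
                  + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

mu = np.zeros(3)
S = np.array([[1.0, 0.5, 0.3],
              [0.5, 1.0, 0.2],
              [0.3, 0.2, 1.0]])
S_pert = S.copy()
S_pert[0, 1] = S_pert[1, 0] = 0.6          # perturb one covariance entry

x = np.array([1.0, -0.5])                  # conditioning value for X
m0, C0 = conditional_mvn(mu, S, x, p=1)
m1, C1 = conditional_mvn(mu, S_pert, x, p=1)
print("KL between conditionals:", kl_mvn(m0, C0, m1, C1))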
Convergence of latent mixing measures in finite and infinite mixture models
We consider Wasserstein distances for assessing the convergence of latent discrete measures, which serve as mixing distributions in hierarchical and nonparametric mixture models. We clarify the relationships between Wasserstein distances of mixing distributions and f-divergence functionals such as Hellinger and Kullback-Leibler distances on the space of mixture distributions using various iden...
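A mixing measure in this sense is a discrete measure whose atoms are component locations and whose weights are mixing proportions. A small sketch of the distance in question, with both measures invented for illustration (scipy's one-dimensional wasserstein_distance stands in for the general optimal-transport computation):

# Wasserstein distance between two discrete mixing measures.
from scipy.stats import wasserstein_distance

# G  = 0.5*delta_{-1}   + 0.5*delta_{2}
# G' = 0.4*delta_{-0.8} + 0.6*delta_{2.3}
w1 = wasserstein_distance(u_values=[-1.0, 2.0], v_values=[-0.8, 2.3],
                          u_weights=[0.5, 0.5], v_weights=[0.4, 0.6])
print(f"W1(G, G') = {w1:.4f}")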
Journal: Entropy
Volume: 14, Issue: -
Pages: -
Publication year: 2012